Article 9321

Title of the article

METHOD OF PREDICTION OF VIDEO SEQUENCE FRAMES BASED ON GENERATIVE NEURAL NETWORKS 

Authors

Valeriya D. Sazykina, Postgraduate student, Penza State University (40 Krasnaya street, Penza, Russia), E-mail: sazykinavd@yandex.ru
Maxim A. Mitrokhin, Doctor of technical sciences, professor, head of the sub-department of computer engineering, Penza State University (40 Krasnaya street, Penza, Russia), E-mail: mmax83@mail.ru 

Index UDK

004.931 

DOI

10.21685/2227-8486-2021-3-9 

Abstract

Background. The main disadvantages of traditional approaches to detecting moving objects in a video stream are considered. The need for new approaches based on deep learning is justified. Materials and methods. The use of a generative adversarial network is proposed as a promising direction for solving the problem of detecting moving objects in a video stream. To preserve semantic information during normalization, a method of spatial-adaptive normalization is proposed. Together with spatial-adaptive normalization, it is proposed to use semantic segmentation and optical flow estimation. Results. As a result of the research, a method for predicting video sequence frames was developed. It is proposed to use Multi-SPADE blocks and to repeatedly apply the "Devon" deformable volume network to the predicted frame and the temporally adjacent real frame. Conclusions. The proposed method for predicting frames of a video sequence can be used as a basis for constructing a method for detecting moving objects.
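The article text is not reproduced here, so as an illustration only, the sketch below shows how a spatial-adaptive normalization layer of the kind mentioned in the abstract can be expressed in PyTorch: the activations are normalized and then modulated per pixel by scale and shift maps predicted from a semantic segmentation map. The class name SpatiallyAdaptiveNorm, the layer sizes, and the 8-class segmentation map are assumptions made for this example, not the authors' implementation.

# Minimal sketch of a spatial-adaptive (SPADE-style) normalization layer, assuming PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatiallyAdaptiveNorm(nn.Module):
    """Normalizes activations, then modulates them with per-pixel gamma/beta
    maps predicted from a semantic segmentation map (illustrative sketch)."""
    def __init__(self, num_features: int, num_classes: int, hidden: int = 64):
        super().__init__()
        # Parameter-free normalization of the generator activations
        self.norm = nn.BatchNorm2d(num_features, affine=False)
        # Shared convolution over the one-hot segmentation map
        self.shared = nn.Sequential(
            nn.Conv2d(num_classes, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Spatially varying scale (gamma) and shift (beta) predicted from the map
        self.gamma = nn.Conv2d(hidden, num_features, kernel_size=3, padding=1)
        self.beta = nn.Conv2d(hidden, num_features, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor, segmap: torch.Tensor) -> torch.Tensor:
        # Resize the segmentation map to the spatial size of the activations
        segmap = F.interpolate(segmap, size=x.shape[2:], mode="nearest")
        features = self.shared(segmap)
        # Per-pixel modulation preserves the semantic layout lost by plain normalization
        return self.norm(x) * (1 + self.gamma(features)) + self.beta(features)

# Usage example: modulate 128-channel activations with an 8-class segmentation map
x = torch.randn(1, 128, 32, 32)
segmap = F.one_hot(torch.randint(0, 8, (1, 32, 32)), num_classes=8)
segmap = segmap.permute(0, 3, 1, 2).float()
layer = SpatiallyAdaptiveNorm(num_features=128, num_classes=8)
out = layer(x, segmap)  # same shape as x: (1, 128, 32, 32)

In a Multi-SPADE arrangement, several such layers can be stacked in one block, each conditioned on a different input (for example, a segmentation map, an optical-flow-warped previous frame); the sketch shows only the single-input case.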

Key words

deep learning, generative adversarial network, spatial-adaptive normalization, space map, semantic segmentation, label embedding network, optical flow, deformable volume network 

For citation

Sazykina V.D., Mitrokhin M.A. Method of prediction of video sequence frames based on generative neural networks. Modeli, sistemy, seti v ekonomike, tekhnike, prirode i obshchestve = Models, systems, networks in economics, technology, nature and society. 2021;(3):91–97. (In Russ.). doi:10.21685/2227-8486-2021-3-9

 

Date created: 13.12.2021 14:37
Date updated: 06.04.2022 13:41